Proofs for empirical likelihood with general f-divergences
Abstract
We study extensions of empirical likelihood where the log likelihood ratio is replaced with general f-divergences (which we call empirical divergences). First, we give a novel, elementary proof of χ²_d-calibration that does not use duality. We then show how to rigorously prove coverage rates for empirical divergence confidence regions by going beyond the asymptotic expansion for the profile likelihood. Finally, we extend the high-dimensional theory for empirical likelihood to general f-divergences and show that the Euclidean divergence gives stronger guarantees in high-dimensional settings.
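As a minimal sketch of the central object in the abstract, the snippet below computes the f-divergence D_f(P‖Q) = Σᵢ qᵢ f(pᵢ/qᵢ) for discrete distributions, with two standard generators: f(t) = t log t (the KL divergence underlying empirical likelihood) and f(t) = (t−1)²/2 (the Euclidean, chi-squared-type divergence mentioned above). The function and variable names are illustrative, not from the paper.

```python
import math

def f_divergence(p, q, f):
    """D_f(P||Q) = sum_i q_i * f(p_i / q_i) for discrete p, q (assumes q_i > 0)."""
    return sum(qi * f(pi / qi) for pi, qi in zip(p, q))

# Generator f(t) = t*log(t) recovers the KL divergence (empirical likelihood).
kl = lambda t: t * math.log(t) if t > 0 else 0.0

# Generator f(t) = (t-1)^2 / 2 gives the Euclidean (chi-squared-type) divergence.
euclidean = lambda t: 0.5 * (t - 1.0) ** 2

p = [0.2, 0.3, 0.5]
q = [1 / 3, 1 / 3, 1 / 3]
print(f_divergence(p, q, kl))   # positive, since p differs from q
print(f_divergence(p, p, kl))   # 0.0 for identical distributions
```

Both generators are convex with f(1) = 0, which is what makes D_f nonnegative and zero exactly when P = Q.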
Similar papers
Statistics of Robust Optimization: A Generalized Empirical Likelihood Approach
We study statistical inference and robust solution methods for stochastic optimization problems. We first develop an empirical likelihood framework for stochastic optimization. We show an empirical likelihood theory for Hadamard-differentiable functionals with general f-divergences and give conditions under which T(P) = inf_{x∈X} E_P[ℓ(x; ξ)] is Hadamard differentiable. Noting ...
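The functional T(P) = inf_{x∈X} E_P[ℓ(x; ξ)] above has a natural plug-in estimate T(P_n) obtained by minimizing the sample average of the loss. The sketch below illustrates this with a squared loss and a crude grid search over x; both choices are illustrative assumptions, not the paper's setup.

```python
def T_hat(sample, loss, grid):
    """Plug-in estimate T(P_n) = min_x (1/n) sum_i loss(x, xi_i) over a grid of x."""
    n = len(sample)
    return min(sum(loss(x, xi) for xi in sample) / n for x in grid)

loss = lambda x, xi: (x - xi) ** 2           # squared loss, minimized at the sample mean
sample = [1.0, 2.0, 3.0, 4.0]
grid = [i / 100 for i in range(0, 501)]      # grid search over x in [0, 5]
print(T_hat(sample, loss, grid))             # 1.25: the sample variance, attained at x = 2.5
```

For the squared loss, T(P_n) is the empirical variance, so the grid search recovers it once the grid contains the sample mean.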
A study on invariance of f-divergence and its application to speech recognition
Identifying features invariant to certain transformations is a fundamental problem in the fields of signal processing and pattern recognition. This paper explores a family of measures called f-divergences that are invariant to invertible transformations, and studies their application to speech recognition. We provide novel proofs for the sufficiency and necessity of the invariance of f-diverg...
Modified signed log-likelihood test for the coefficient of variation of an inverse Gaussian population
In this paper, we consider the problem of two-sided hypothesis testing for the coefficient of variation of an inverse Gaussian population. The approach used here is the modified signed log-likelihood ratio (MSLR) method, a modification of the traditional signed log-likelihood ratio test. Previous works show that this proposed method has third-order accuracy whereas the traditi...
Divergences and Risks for Multiclass Experiments
Csiszár's f-divergence is a way to measure the similarity of two probability distributions. We study the extension of f-divergence to more than two distributions to measure their joint similarity. By exploiting classical results from the comparison of experiments literature we prove the resulting divergence satisfies all the same properties as the traditional binary one. Considering the multi...
On Generalized Empirical Likelihood Methods
We introduce estimation and test procedures through divergence minimization for models satisfying linear constraints with unknown parameter. These procedures extend the empirical likelihood (EL) method and share common features with generalized empirical likelihood (GEL) approach. We treat the problems of existence and characterization of the divergence projections of probability measures on se...
Publication year: 2016